A secant-based Nesterov method for convex functions

Authors

  • Razak Alli-Oke
  • William P. Heath
Abstract

A simple secant-based fast gradient method is developed for problems whose objective function is convex and well-defined. The proposed algorithm extends the classical Nesterov gradient method by updating the estimate-sequence parameter with secant information whenever possible. This is achieved by imposing a secant condition on the choice of search point. Furthermore, the proposed algorithm embodies an "update rule with reset" that parallels the restart rule recently suggested in O'Donoghue and Candes (2013). The proposed algorithm applies to a large class of problems, including the logistic and least-squares losses commonly found in the machine learning literature. Numerical results demonstrating the efficiency of the proposed algorithm are analyzed with the aid of performance profiles.
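The classical scheme the paper builds on can be sketched as follows: Nesterov's fast gradient method with the gradient-based adaptive restart of O'Donoghue and Candes (2013). This is a hedged toy illustration of that baseline, not the paper's secant update; the function and parameter names are assumptions.

```python
import numpy as np

def nesterov_restart(grad, x0, L, n_iter=500):
    """Nesterov's fast gradient method with the gradient-based adaptive
    restart of O'Donoghue and Candes (2013). Hypothetical sketch of the
    classical scheme the paper extends -- not its secant update."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        g = grad(y)
        x_new = y - g / L                   # gradient step from the search point
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        if g @ (x_new - x) > 0:             # momentum opposes the gradient: reset
            t_new, y = 1.0, x_new
        else:
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# least-squares loss, one of the problem classes named in the abstract
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)              # Lipschitz constant of the gradient
x_star = nesterov_restart(grad, np.zeros(2), L)
```

The reset branch discards the accumulated momentum whenever it points against the current gradient, which is the "update rule with reset" idea the abstract parallels.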


Similar resources

Two Settings of the Dai-Liao Parameter Based on Modified Secant Equations

Following the setting of the Dai-Liao (DL) parameter in conjugate gradient (CG) methods, we introduce two new parameters based on the modified secant equation proposed by Li et al. (Comput. Optim. Appl. 202:523-539, 2007) with two approaches, which use an extended new conjugacy condition. The first is based on a modified descent three-term search direction, as the descent Hest...
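For context, the Dai-Liao search direction that such parameters modify can be sketched as below. This is an illustrative toy, with t as the Dai-Liao parameter; the modified secant equations of the snippet would replace the gradient difference y.

```python
import numpy as np

def dai_liao_direction(g_new, g_old, d_old, s, t=0.1):
    """Dai-Liao conjugate-gradient direction d = -g + beta * d_old with
    beta = g_new^T (y - t*s) / (d_old^T y). Illustrative sketch only;
    modified secant equations would replace y below."""
    y = g_new - g_old                           # gradient difference
    beta = g_new @ (y - t * s) / (d_old @ y)
    return -g_new + beta * d_old

d = dai_liao_direction(np.array([0.5, 0.2]), np.array([1.0, 0.0]),
                       np.array([-1.0, 0.0]), np.array([-0.5, 0.0]))
```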


Math 2400 Lecture Notes: Differential Miscellany

Contents: 1. L'Hôpital's Rule (1.1 The Cauchy Mean Value Theorem; 1.2 L'Hôpital's Rule); 2. Newton's Method (2.1 Introducing Newton's Method; 2.2 A Babylonian Algorithm; 2.3 Questioning Newton's Method; 2.4 Introducing Infinite Sequences; 2.5 Contractions and Fixed Points; 2.6 Convergence of Newton's Method; 2.7 Quadratic Convergence of Newton's Method; 2.8 An example of nonconverge...
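The Babylonian algorithm mentioned in these notes is Newton's method applied to f(x) = x^2 - a; a minimal sketch:

```python
def babylonian_sqrt(a, x0=1.0, tol=1e-12, max_iter=100):
    """Newton's method on f(x) = x^2 - a reduces to the Babylonian
    iteration x <- (x + a/x)/2, which converges quadratically."""
    x = x0
    for _ in range(max_iter):
        if abs(x * x - a) <= tol:
            break
        x = 0.5 * (x + a / x)
    return x
```

Starting from x0 = 1, `babylonian_sqrt(2.0)` reaches full double precision in a handful of iterations, reflecting the quadratic convergence discussed in section 2.7.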


Two new conjugate gradient methods based on modified secant equations

Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of our proposed methods is based on a modified version of the secant equation proposed by Zhang, Deng and Chen, and Zhang and Xu, and the other is based on the modified BFGS update proposed by Yuan. An interesting feature of our methods is their acco...
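As a point of reference, the Zhang-Deng-Chen modified secant equation replaces the gradient difference y with a vector that also folds in function values. A sketch, taking the auxiliary vector u as s (an assumption made here for simplicity):

```python
import numpy as np

def modified_secant_vector(f_old, f_new, g_old, g_new, s):
    """Zhang-Deng-Chen modified secant vector: y + (theta / s^T s) * s with
    theta = 6(f_old - f_new) + 3(g_old + g_new)^T s. Sketch with the
    auxiliary vector u taken as s; for quadratics theta vanishes and the
    classical secant vector y is recovered."""
    y = g_new - g_old
    theta = 6.0 * (f_old - f_new) + 3.0 * (g_old + g_new) @ s
    return y + (theta / (s @ s)) * s

# quadratic check: f(x) = x1^2 + 2*x2^2, step from (1, 1) to (2, 0)
ybar = modified_secant_vector(3.0, 4.0, np.array([2.0, 4.0]),
                              np.array([4.0, 0.0]), np.array([1.0, -1.0]))
```

On the quadratic above, theta is exactly zero, so the modified vector coincides with the ordinary secant vector y, as the theory predicts.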


A meshless technique for nonlinear Volterra-Fredholm integral equations via hybrid of radial basis functions

In this paper, an effective technique is proposed to determine the numerical solution of nonlinear Volterra-Fredholm integral equations (VFIEs) which is based on interpolation by the hybrid of radial basis functions (RBFs) including both inverse multiquadrics (IMQs), hyperbolic secant (Sechs) and strictly positive definite functions. Zeros of the shifted Legendre polynomial are used as the collocatio...
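As a toy illustration of the hyperbolic-secant RBF named above (not the paper's Volterra-Fredholm collocation scheme), 1D interpolation with phi(r) = sech(eps * r) looks like this; the function name and eps value are assumptions:

```python
import numpy as np

def sech_rbf_interpolate(xc, fc, x_eval, eps=1.0):
    """1D interpolation with the hyperbolic-secant RBF phi(r) = sech(eps*r),
    a strictly positive definite kernel. Toy sketch, not the paper's
    Volterra-Fredholm scheme."""
    phi = lambda r: 1.0 / np.cosh(eps * r)
    A = phi(np.abs(xc[:, None] - xc[None, :]))       # collocation matrix
    w = np.linalg.solve(A, fc)                       # interpolation weights
    return phi(np.abs(x_eval[:, None] - xc[None, :])) @ w

xc = np.linspace(-1.0, 1.0, 7)                       # centers
vals = sech_rbf_interpolate(xc, np.sin(xc), xc)      # reproduce data at centers
```

Because sech is strictly positive definite, the collocation matrix is invertible for distinct centers, so the interpolant reproduces the data exactly at the centers.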


Linear Systems: Positive transfer functions and convex optimization

Recently, a compact characterization of scalar positive polynomials on the real line and on the unit circle was derived by Nesterov [3]. In this paper we show how to extend this result to pseudo-polynomial matrices, and also present a new proof based on the positive real lemma. The characterization is very similar to the scalar case and also allows the use of fast algorithms for computing the ce...
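The scalar fact behind this characterization, that a polynomial nonnegative on the real line is a sum of two squares, can be illustrated numerically. This is a sketch for strictly positive p (no real roots) with positive leading coefficient, via root pairing rather than the paper's semidefinite machinery:

```python
import numpy as np

def sum_of_two_squares(p):
    """Decompose p(x) > 0 on the real line as q1(x)^2 + q2(x)^2.
    Sketch for strictly positive p (no real roots) with positive leading
    coefficient; p lists coefficients from the highest degree down.
    Keeping one root per conjugate pair gives complex h with |h|^2 = p."""
    roots = np.roots(p)
    upper = roots[roots.imag > 0]          # one root from each conjugate pair
    h = np.sqrt(p[0]) * np.poly(upper)     # complex h(x) with |h(x)|^2 = p(x)
    return h.real, h.imag                  # q1, q2 coefficient vectors

q1, q2 = sum_of_two_squares([1.0, 0.0, 2.0, 0.0, 1.0])   # x^4 + 2x^2 + 1
```

Here x^4 + 2x^2 + 1 = (x^2 + 1)^2 decomposes (up to sign) as (x^2 - 1)^2 + (2x)^2.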




Journal:
  • Optimization Letters

Volume 11, Issue

Pages -

Publication year: 2017